Particle physics
SUPA: A Lightweight Diagnostic Simulator for Machine Learning in Particle Physics
Deep learning methods have gained popularity in high energy physics for fast modeling of particle showers in detectors. Detailed simulation frameworks such as the gold standard \textsc{Geant4} are computationally intensive, and current deep generative architectures work on discretized, lower resolution versions of the detailed simulation. The development of models that work at higher spatial resolutions is currently hindered by the complexity of the full simulation data, and by the lack of simpler, more interpretable benchmarks. Our contribution is \textsc{SUPA}, the SUrrogate PArticle propagation simulator, an algorithm and software package for generating data by simulating simplified particle propagation, scattering and shower development in matter. The generation is extremely fast and easy to use compared to \textsc{Geant4}, but still exhibits the key characteristics and challenges of the detailed simulation. The proposed simulator generates thousands of particle showers per second on a desktop machine, a speed-up of up to six orders of magnitude over \textsc{Geant4}, and stores detailed geometric information about the shower propagation.
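The kind of simplified propagation the abstract describes can be sketched in a few lines. The toy model below (a hypothetical illustration, not SUPA's actual algorithm; all parameter names are made up) propagates a particle in 2-D, applies Gaussian scattering at each step, and recursively splits it into two daughters until the energy falls below a deposition threshold:

```python
import math
import random

def simulate_shower(energy=100.0, split_e=10.0, step=1.0, scatter_sigma=0.1):
    """Toy 2-D shower: a particle propagates, scatters, and splits in two
    until its energy drops below split_e, at which point it deposits its
    remaining energy as a hit. Returns a list of (x, y, energy) hits."""
    hits = []
    # work stack of particles: (x, y, direction angle, energy)
    stack = [(0.0, 0.0, 0.0, energy)]
    while stack:
        x, y, ang, e = stack.pop()
        if e < split_e:
            hits.append((x, y, e))  # deposit remaining energy
            continue
        ang += random.gauss(0.0, scatter_sigma)  # multiple-scattering kick
        x += step * math.cos(ang)
        y += step * math.sin(ang)
        # pair-production-like split into two equal-energy daughters
        stack.append((x, y, ang + 0.05, e / 2))
        stack.append((x, y, ang - 0.05, e / 2))
    return hits

random.seed(0)
shower = simulate_shower()
```

By construction the total deposited energy equals the incident energy, which makes such toy data easy to validate — one of the diagnostic properties the benchmark aims for.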
Quantum computers turned out to be more useful than expected in 2025
For the past year, I kept bringing the same story to my editor: quantum computers are on the edge of becoming useful for scientific discovery. Of course, that has always been the goal. The idea of using quantum computers to better understand our universe is part of their origin story, and it even featured in a 1981 speech by Richard Feynman. Contemplating the best way to simulate nature, he wrote: "We can give up on our rule about what the computer was, we can say: Let the computer itself be built of quantum mechanical elements which obey quantum mechanical laws." Today, Feynman's vision has been realised by Google, IBM and dozens more companies and academic teams. Their devices are now being used to simulate reality at the quantum level - and here are some highlights.
Inside the wild experiments physicists would do with zero limits
From a particle smasher encircling the moon to an "impossible" laser, five scientists reveal the experiments they would run in a world powered purely by imagination In physics, breakthroughs are rare. Experiments are slow, expensive and often end up refining, rather than rewriting, our understanding of the universe. But what if the only constraint on scientific ambition were imagination? We asked five physicists to describe the kind of experiment they would do if they didn't have to worry about budgets, engineering limitations or political realities. Not because we expect any of it to happen soon - though in a few cases, momentum is building - but because it is revealing to see where their minds go when the usual boundaries are stripped away. One researcher wants to launch radio telescopes deep into space to probe dark matter with cosmic energy flashes.
An interpretable unsupervised representation learning for high precision measurement in particle physics
Lv, Xing-Jian, Miao, De-Xing, Xu, Zi-Jun, Wang, Jian-Chun
Machine learning, and in particular its modern incarnation of deep learning (DL) [1, 2], has become an indispensable tool in particle physics, a field that routinely handles vast datasets and nonlinear relationships among observables [3-6]. In recent years, advances in DL have expanded the scope of data-driven progress across the energy, intensity, accelerator, and cosmic frontiers [7, 8]. Despite remarkable advancements, most current DL applications in particle physics are supervised, relying either on Monte Carlo (MC) simulations or on labeled experimental data. However, because simulations cannot fully capture the complexity of the real world, a persistent gap between MC and data leads to training bias. Direct training on real data, in turn, demands extensive human labeling, which is labor-intensive and hard to scale [9]. For this reason, the development of unsupervised DL [10, 11] is integral to particle physics. Unsupervised learning has achieved remarkable success in tasks such as clustering [12, 13], anomaly detection [14, 15], and learning representations [16, 17].
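Clustering, the first of the unsupervised tasks listed above, needs no labels at all: structure is recovered directly from the observables. A minimal illustration (not the paper's method — the data and variable names here are invented) is plain k-means on a one-dimensional toy observable with two populations:

```python
import random

def kmeans(points, k=2, iters=20):
    """Plain k-means on 1-D observables, with deterministic
    quantile-spread initialisation. Returns sorted cluster centers."""
    pts = sorted(points)
    centers = [pts[(2 * i + 1) * len(pts) // (2 * k)] for i in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda j: abs(p - centers[j]))
            clusters[nearest].append(p)
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return sorted(centers)

random.seed(1)
# two toy populations of a hypothetical observable:
# "signal-like" events near 0 and "background-like" events near 5
data = ([random.gauss(0.0, 0.5) for _ in range(200)]
        + [random.gauss(5.0, 0.5) for _ in range(200)])
centers = kmeans(data, k=2)
```

The recovered centers sit near the two population means even though no event was labeled — the sense in which unsupervised methods sidestep both the MC-data gap and the labeling cost described above.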
A Step Toward Interpretability: Smearing the Likelihood
The problem of interpretability of machine learning architectures in particle physics has no agreed-upon definition, much less any proposed solution. We present a first modest step toward these goals by proposing a definition and corresponding practical method for isolation and identification of relevant physical energy scales exploited by the machine. This is accomplished by smearing or averaging over all input events that lie within a prescribed metric energy distance of one another, which correspondingly renders any quantity measured on a finite, discrete dataset continuous over the dataspace. Within this approach, we are able to explicitly demonstrate that (approximate) scaling laws are a consequence of extreme value theory applied to analysis of the distribution of the irreducible minimal distance over which a machine must extrapolate given a finite dataset. As an example, we study quark versus gluon jet identification, construct the smeared likelihood, and show that discrimination power steadily increases as resolution decreases, indicating that the true likelihood for the problem is sensitive to emissions at all scales.
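The smearing idea — average contributions from all training events within a metric window of the test point — can be sketched as a kernel-weighted likelihood ratio. The snippet below is a stand-in, not the paper's construction: it uses a 1-D observable and a Gaussian window of width `sigma` in place of the paper's metric energy distance, and all names are illustrative:

```python
import math

def smeared_likelihood_ratio(x, events, labels, sigma=0.5):
    """Kernel-smeared likelihood ratio L(x) ~ p_sig(x) / p_bkg(x):
    every training event contributes with a Gaussian weight set by its
    distance to x, so L is continuous even on a finite, discrete dataset.
    labels: 1 for signal (e.g. quark jets), 0 for background (gluon jets)."""
    w_sig = w_bkg = 0.0
    for e, lab in zip(events, labels):
        w = math.exp(-((x - e) ** 2) / (2.0 * sigma ** 2))
        if lab == 1:
            w_sig += w
        else:
            w_bkg += w
    return w_sig / max(w_bkg, 1e-12)

# tiny labeled toy sample: signal near 0, background near 2
events = [0.0, 0.1, 2.0, 2.1]
labels = [1, 1, 0, 0]
r_sig = smeared_likelihood_ratio(0.0, events, labels)
r_bkg = smeared_likelihood_ratio(2.0, events, labels)
```

The smearing width plays the role of a resolution knob: varying `sigma` and watching the discrimination power change is the spirit of the scale-isolation diagnostic the abstract describes.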
Large Physics Models: Towards a collaborative approach with Large Language Models and Foundation Models
Barman, Kristian G., Caron, Sascha, Sullivan, Emily, de Regt, Henk W., de Austri, Roberto Ruiz, Boon, Mieke, Färber, Michael, Fröse, Stefan, Hasibi, Faegheh, Ipp, Andreas, Kapoor, Rukshak, Kasieczka, Gregor, Kostić, Daniel, Krämer, Michael, Golling, Tobias, Lopez, Luis G., Marco, Jesus, Otten, Sydney, Pawlowski, Pawel, Vischia, Pietro, Weber, Erik, Weniger, Christoph
This paper explores ideas and provides a potential roadmap for the development and evaluation of physics-specific large-scale AI models, which we call Large Physics Models (LPMs). These models, based on foundation models such as Large Language Models (LLMs) - trained on broad data - are tailored to address the demands of physics research. LPMs can function independently or as part of an integrated framework. This framework can incorporate specialized tools, including symbolic reasoning modules for mathematical manipulations, frameworks to analyse specific experimental and simulated data, and mechanisms for synthesizing theories and scientific literature. We begin by examining whether the physics community should actively develop and refine dedicated models, rather than relying solely on commercial LLMs. We then outline how LPMs can be realized through interdisciplinary collaboration among experts in physics, computer science, and philosophy of science. To integrate these models effectively, we identify three key pillars: Development, Evaluation, and Philosophical Reflection. Development focuses on constructing models capable of processing physics texts, mathematical formulations, and diverse physical data. Evaluation assesses accuracy and reliability by testing and benchmarking. Finally, Philosophical Reflection encompasses the analysis of broader implications of LLMs in physics, including their potential to generate new scientific understanding and what novel collaboration dynamics might arise in research. Inspired by the organizational structure of experimental collaborations in particle physics, we propose a similarly interdisciplinary and collaborative approach to building and refining Large Physics Models. This roadmap provides specific objectives, defines pathways to achieve them, and identifies challenges that must be addressed to realise physics-specific large scale AI models.
HPCNeuroNet: A Neuromorphic Approach Merging SNN Temporal Dynamics with Transformer Attention for FPGA-based Particle Physics
Isik, Murat, Vishwamith, Hiruna, Naoukin, Jonathan, Dikmen, I. Can
This paper presents the innovative HPCNeuroNet model, a pioneering fusion of Spiking Neural Networks (SNNs), Transformers, and high-performance computing tailored for particle physics, particularly in particle identification from detector responses. Our approach leverages SNNs' intrinsic temporal dynamics and Transformers' robust attention mechanisms to enhance performance when discerning intricate particle interactions. At the heart of HPCNeuroNet lies the integration of the sequential dynamism inherent in SNNs with the context-aware attention capabilities of Transformers, enabling the model to precisely decode and interpret complex detector data. HPCNeuroNet is realized through the HLS4ML framework and optimized for deployment in FPGA environments. The model accuracy and scalability are also enhanced by this architectural choice. Benchmarked against machine learning models, HPCNeuroNet showcases better performance metrics, underlining its transformative potential in high-energy physics. We demonstrate that the combination of SNNs, Transformers, and FPGA-based high-performance computing in particle physics marks a significant step forward and provides a strong foundation for future research.
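The two ingredients being fused — spiking temporal dynamics and attention — each reduce to a few lines in isolation. The sketch below is purely illustrative of the building blocks, not HPCNeuroNet's architecture: a leaky integrate-and-fire neuron turns a time series of detector-like inputs into a spike train, and a scalar softmax attention pools 1-D features (so no scaling by feature dimension is needed):

```python
import math

def lif_spikes(inputs, threshold=1.0, leak=0.9):
    """Leaky integrate-and-fire neuron: the membrane potential leaks each
    step, integrates the input, and emits a spike (1) on crossing the
    threshold, after which it resets to zero."""
    v, spikes = 0.0, []
    for x in inputs:
        v = leak * v + x
        if v >= threshold:
            spikes.append(1)
            v = 0.0
        else:
            spikes.append(0)
    return spikes

def attention(query, keys, values):
    """Softmax dot-product attention over scalar features: weight each
    value by the (normalised, numerically stabilised) exp of its
    key-query score and return the weighted sum."""
    scores = [query * k for k in keys]
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    z = sum(exps)
    return sum(e / z * v for e, v in zip(exps, values))

spikes = lif_spikes([0.5, 0.5, 0.5, 0.5])
pooled = attention(1.0, keys=[1.0, -1.0], values=[10.0, 0.0])
```

In an SNN-Transformer hybrid, spike-train statistics of this kind would feed the attention stage; how HPCNeuroNet actually couples the two (and maps them to FPGA logic via HLS4ML) is specified in the paper, not here.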